Natural information measures for contextual probabilistic models
Authors
Abstract
My greatest concern was what to call it. I thought of calling it an ‘information’, but the word was overly used, so I decided to call it an ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have an advantage’.[1] (see also [2], page 35).
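(For context: the "uncertainty function" Shannon refers to is what is now called the Shannon entropy of a probability distribution, and its quantum counterpart, the von Neumann entropy, is the standard information measure for density operators. The notation below is generic and is not taken from the paper itself.)

\[
H(p_1,\dots,p_n) \;=\; -\sum_{i=1}^{n} p_i \log p_i,
\qquad
S(\rho) \;=\; -\operatorname{Tr}\bigl(\rho \log \rho\bigr).
\]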
Similar articles
Probabilistic concept verification for language understanding in spoken dialogue systems
In past research, several kinds of information have been explored to assess the confidence measure or to select the confidence tag for a word/phrase; however, contextual confidence information has received little attention. In this paper, we propose a concept-based probabilistic verification model to integrate contextual confidence information. In this model, a concept is verified not only ac...
A single-stage top-down probabilistic approach towards understanding spoken and handwritten mathematical formulas
We present a novel approach towards a multimodal analysis of natural speech and handwriting input for entering mathematical expressions into a computer. It utilizes an integrated, multilevel probabilistic architecture with a joint semantic model and two distinct syntactic models describing speech and script properties, respectively. Compared to classical multistage solutions, our single-stage strategy...
Implicational Scaling of Reading Comprehension Construct: Is it Deterministic or Probabilistic?
In English as a Second Language teaching and testing situations, it is common to draw inferences about a learner's reading ability based on his or her total score on a reading test. This assumes that reading items are unidimensional and reproducible. However, little research has been conducted to probe this issue through psychometric analyses. In the present study, the IELTS exemplar module C (1994) wa...
Investigating how well contextual features are captured by bi-directional recurrent neural network models
Learning algorithms for natural language processing (NLP) tasks have traditionally relied on manually defined contextual features. Neural network models, on the other hand, learn these features automatically and have been successfully applied to several NLP tasks. Such models consider only vector representations of words and thus do not require manual feature engineering. This m...
Multi-Object Classification and Unsupervised Scene Understanding Using Deep Learning Features and Latent Tree Probabilistic Models
Deep learning has shown state-of-the-art classification performance on datasets such as ImageNet, which contains a single object in each image. However, multi-object classification is far more challenging. We present a unified framework that leverages the strengths of multiple machine learning methods, viz. deep learning, probabilistic models, and kernel methods, to obtain state-of-the-art performance on ...
Journal: Quantum Information & Computation
Volume 16, Issue -
Pages -
Publication year: 2016